BMC Nephrology
Springer Science and Business Media LLC
Preprints posted in the last 30 days, ranked by how well they match BMC Nephrology's content profile, based on 12 papers previously published here. The average preprint has a 0.08% match score for this journal, so anything above that is already an above-average fit.
Chong, K.; Litvinovich, I.; Argyropoulos, C.; Zhu, Y.
Background: Rising kidney discard rates and uncertainty around accepting higher-risk donor kidneys highlight the need for decision support tools that integrate donor and recipient factors and communicate risk in ways that are understandable and usable at the time of offer. Conventional indices (e.g., KDPI/KDRI) provide population-level signals but do not deliver individualized, cognitively accessible information aligned with real-time clinical workflows. Objective: To describe how key transplant stakeholders (patients, coordinators, and providers) interpret and evaluate a prototype Kidney Risk Calculator app that generates donor-recipient-specific survival projections, and to identify the content, format and features, and functionality needed for clinically meaningful, patient-centered decision support. Design: Qualitative study using focus groups and individual interviews. Setting: University of New Mexico Hospital (UNMH) Kidney Transplant Center. Participants: Five patients (four transplant candidates and one patient advocate), three transplant coordinators, and five transplant providers (three attending physicians and two advanced practice practitioners). Methods: Semi-structured sessions (45 to 60 minutes) with 13 stakeholders (patients, coordinators, and providers) included a live app demonstration and explored usability, interpretability, contextual information needs, perceived clinical utility, and anticipated barriers/facilitators. Data were collected via one coordinator focus group, one patient focus group, and five provider interviews; sessions were recorded, transcribed, de-identified, and analyzed using inductive reflexive thematic analysis. Results: Stakeholders affirmed the value of personalized projections as an adjunct to clinical judgment, particularly for higher-risk offers.
Participants prioritized: 1) Content: clear education on hepatitis C virus (HCV)-positive donors and Public Health Service (PHS) risk criteria; plain explanations of Calculated Panel Reactive Antibody (CPRA); and framing that makes time on dialysis and tradeoffs salient; 2) Format & Features: plain-language narratives, percentages rather than decimals, simple visuals, minimized acronyms, U.S. customary units, and a stepwise (TurboTax-like) input flow preferred by patients; and 3) Functionality: attention to cognitive load and workflow alignment, given phone-based time pressure and digital access constraints. Stakeholders emphasized that the value of the tool hinges on clarity, context, and workflow fit, not predictive accuracy alone. Limitations: Single-center, formative prototype study with a modest sample; findings are illustrative and may have limited transferability. Participants reacted to a demonstration rather than using the app during real-time offer calls; convenience/email recruitment and Zoom-only, English-language sessions may introduce selection bias; and team involvement in app development may contribute residual confirmation bias despite mitigation. Conclusions: Early stakeholder input suggests that a kidney offer decision support tool should integrate individualized predictions with plain-language explanations, contextual information that addresses common misconceptions, workflow-aligned functionality, and accessible outputs. Tools designed and implemented with these features may support acceptance of medically complex kidneys and may help reduce offer bypass and organ discard. These inferences reflect stakeholder perceptions in a formative qualitative study and warrant prospective evaluation.
Hughes-Austin, J. M.; Claravall, L.; Katz, R.; Kado, D. M.; Schwartz, A. K.; Kent, W. T.; Girard, P.; Pereira, R. C.; Salusky, I. B.; Ix, J. H.
Individuals with chronic kidney disease (CKD) have higher rates of hip fracture and post-fracture mortality. Although they may develop age-related osteoporosis similar to those without CKD, they may also exhibit CKD-related metabolic bone disease (MBD), characterized by low, high, or mixed turnover at similar levels of bone mineral density (BMD). Because BMD does not provide information about turnover status, clinical decision-making is challenging. This study evaluated the associations between circulating bone-turnover biomarkers and static histomorphometry in patients undergoing hip-fracture surgery. In this cross-sectional study, we enrolled adults with and without CKD, defined as estimated glomerular filtration rate (eGFR) ≤60 mL/min/1.73 m² (CKD-EPI 2021), undergoing hip-fracture surgery. Blood samples, bone specimens from the femoral head or greater trochanter, and demographic and clinical data were collected at the time of surgery. Plasma biomarkers included α-Klotho, bone alkaline phosphatase (BAP), dickkopf-related protein 1 (DKK-1), fibroblast growth factor 23 (FGF23), tartrate-resistant acid phosphatase 5b (TRAP5b), parathyroid hormone (PTH), and sclerostin. Logistic regression models, adjusted for age, gender, eGFR, and osteoporosis, assessed associations with CKD status. Tertiles of osteoblast surface (Ob.S/BS) and eroded surface (ES/BS) were defined in participants without CKD and applied to the full cohort. Multinomial and multivariable linear regression evaluated associations of biomarkers with these histomorphometry parameters. Among 97 enrolled participants (mean age 80 ± 11 years; 67% female), 68% had CKD. Of 75 with complete biomarker and histomorphometry data, 96% demonstrated low bone turnover. CKD was associated with lower trabecular thickness (Tb.Th) and higher osteoid thickness (O.Th), osteoid volume (OV/BV), and osteoid surface (OS/BS), suggesting thinner, largely unmineralized trabeculae.
Higher BAP (222.2% difference per doubling; 95% CI 77.2-485.8) and TRAP5b (319.3%; 95% CI 128.3-669.5) were directly associated with Ob.S/BS and ES/BS, whereas sclerostin was inversely associated with ES/BS (-28.9%; 95% CI -44.8 to -7.1). PTH was not associated with bone-turnover measures. These findings suggest that BAP, TRAP5b, and sclerostin may provide useful adjunct information alongside PTH for assessing bone turnover and guiding therapy in patients with and without CKD.
Lim, R. S.; Harris, T.; Jefferis, J.; Jahan, S.; Lim, R. S.; D Arrietta, L. M.; Ng, K. H.; Chin, H. L.; Goh, L. L.; Acharyya, S.; Chan, E. C. Y.; Patel, C.; Biros, E.; Sevdalis, N.; Mallett, A. J.
Introduction: Genomic testing is reshaping nephrology practice, yet the structure, outcomes, and implementation of kidney genetics services remain poorly characterized. Methods: We conducted a two-part scoping study comprising (i) a literature review (JBI methodology, PRISMA-ScR compliant; OSF registration doi.org/10.17605/OSF.IO/N32VA) of English-language publications (2000-2025) describing kidney genetics services and outcomes, and (ii) an international stakeholder consultation of clinic leads to capture real-world service and implementation experiences. Results: Sixty studies were included, predominantly from North America (n=23), followed by Europe (n=17), Australia/New Zealand (n=10), United Kingdom/Ireland (n=5), and Asia (n=5). Among the 25 studies describing clinic models, four types were identified: multidisciplinary integrated (n=12), nephrologist-led (n=9), mainstreaming (n=2), and traditional genetics referral (n=2). Clinic structure varied by region. Outcome reporting focused on diagnostic yield (92%), with limited data on timeliness (16%), patient-reported outcomes (12%), or implementation outcomes (4%). Test penetration was high across regions and models, while diagnostic yield varied. Nephrologist-led clinics demonstrated performance comparable to multidisciplinary models when adequately supported. International stakeholder consultation data (n=48) revealed diversification of clinic models across regions. In Australia/New Zealand, multidisciplinary clinics predominated, supported by public funding and in-house or hybrid laboratories. United Kingdom/Ireland clinics used public funding and a national laboratory. North American clinics showed greater heterogeneity, with a higher prevalence of nephrologist-led models, reliance on commercial laboratories, and mixed or private funding. Asian clinics reported nephrologist-led models, resource constraints, and hybrid testing and funding arrangements.
Comprehensive sequencing with virtual panels predominated in Australia/New Zealand, the United Kingdom, and Europe; phenotype-driven panels ± reflex testing were more common in North America. Conclusions: Kidney genetics care is expanding but remains unevenly implemented. Nephrologist-led and multidisciplinary models can be effective with appropriate support. Patient selection may influence diagnostic yield more than testing modality. Standardized outcome reporting and theory-driven implementation evaluation are essential for delivering equitable, sustainable genomic services. Lay summary: This study examined how kidney genetics services are delivered across the globe. We reviewed 60 studies (2000-2025), consulted 48 clinic leaders globally, identified four service models (multidisciplinary integrated, nephrologist-led, mainstreaming, and traditional genetics referral), and mapped variation in care teams, test strategies, test laboratories, and funding. Most studies reported diagnostic yield, but few assessed patient experience or how well services were implemented. European programs showed the highest performance, attributed to clear referral criteria, deep phenotyping, detailed family histories, multidisciplinary review, and strong public funding. Clinics led by nephrologists performed comparably to multidisciplinary teams when adequately supported. Across all settings, patient selection was more important than the specific type of genetic test used in determining diagnostic success. Kidney genetics services are expanding but remain uneven. This study highlights the need for context-specific, theory-informed, and determinants-targeted strategies to support scalable, equitable, and sustainable genomic care worldwide.
Tefera, B.; Ali, R.; Megersa, B. S.; Girma, T.; Friis, H.; Abera, M.; Belachew, T.; Olsen, M. F.; Filteau, S.; Wells, J. C.; Wibaek, R.; Yilma, D.; Nitsch, D.
Introduction: Measuring glomerular filtration rate (GFR) directly is invasive. Therefore, in clinical care, estimated GFR is derived from serum levels of endogenous filtration markers such as creatinine and cystatin C. Multiple studies from high-income countries have shown differences between estimated glomerular filtration rate based on cystatin C (eGFRcys) and on creatinine (eGFRcr). This study aimed to assess the agreement between eGFRcys and eGFRcr in Ethiopian children and to identify factors associated with higher eGFRcys and eGFRcr. Methods: We studied 350 Ethiopian children who were part of the iABC birth cohort study. At the most recent follow-up (average age 10 years), serum cystatin C and creatinine were measured. Formulas by Berg (2015) and Hoste (2014) were used to estimate eGFRcys and eGFRcr, respectively, and Bland-Altman plots assessed their agreement. The difference in eGFR (eGFRdiff) was calculated and categorized as less than -15 mL/min/1.73 m² (higher eGFRcr), between -15 and 15 mL/min/1.73 m² (concordant), and greater than or equal to 15 mL/min/1.73 m² (higher eGFRcys). Multinomial logistic regression was used to identify factors associated with higher eGFRcr and higher eGFRcys. Results: Estimated glomerular filtration rate (eGFR) varied substantially depending on the estimation formula used. Using the Berg (2015) and Hoste (2014) formulas, the median (IQR) eGFRcys and eGFRcr were 99.4 (90.0; 114.1) and 123.2 (110.3; 143.8) mL/min/1.73 m², respectively. Overall, we observed poor agreement between eGFRcys and eGFRcr: only 94 (27.6%) children had concordant results, compared with 220 (64.7%) with higher eGFRcr and 26 (7.6%) with higher eGFRcys. If the eGFRcys results are considered reliable, 27.5% of the children had an eGFR below 90 mL/min/1.73 m². Conclusion: There was very marked variation in the distributions of estimated GFR depending on which pediatric formulas were used.
Agreement between eGFR estimated using cystatin C and that estimated using creatinine was poor among Ethiopian children. Relative to eGFRcys, the creatinine-based equation may overestimate kidney function by up to 30 mL/min/1.73 m² in this Ethiopian cohort. Ideally, a validation study with GFR measured by a gold-standard method (inulin clearance) in children is required; however, given the invasiveness and cost of inulin clearance, iohexol clearance studies are recommended.
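The eGFR-difference categorisation described in this abstract can be sketched as a simple rule. This is a minimal illustration: the thresholds (-15 and 15 mL/min/1.73 m²) and the example medians (99.4 vs 123.2) come from the abstract, but the function name and category labels are illustrative, not the study's own code.

```python
def categorise_egfr_diff(egfr_cys: float, egfr_cr: float) -> str:
    """Classify agreement between cystatin C- and creatinine-based eGFR.

    eGFRdiff = eGFRcys - eGFRcr, in mL/min/1.73 m^2, binned using the
    -15 / +15 cut-points reported in the abstract.
    """
    diff = egfr_cys - egfr_cr
    if diff < -15:
        return "higher eGFRcr"   # creatinine equation reads substantially higher
    if diff >= 15:
        return "higher eGFRcys"  # cystatin C equation reads substantially higher
    return "concordant"          # -15 <= diff < 15

# Applying the rule to the study's median values:
print(categorise_egfr_diff(99.4, 123.2))  # -> higher eGFRcr
```

With the reported medians the difference is -23.8 mL/min/1.73 m², which falls in the "higher eGFRcr" bin, consistent with the majority (64.7%) of children in the study.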
Argoty Pantoja, A. D.; van der Most, P. J.; Kamali, Z.; Ganji-Arjenaki, M.; van der Vaart, A.; Vaez, A.; Bakker, S. J. L.; Snieder, H.; de Borst, M. H.
Introduction: Genome-wide association studies (GWAS) for kidney function have mainly focused on creatinine-based estimated glomerular filtration rate (eGFRcrea), which is affected by variation in muscle mass. Moreover, the genetic basis of the sexual dimorphism of chronic kidney disease is underexplored. Methods: We performed a GWA meta-analysis for creatinine clearance (CrCl), a muscle mass-independent kidney function phenotype, in 58,976 individuals of European descent from the Lifelines Cohort Study. Results: We identified 16 independent loci with 21 genome-wide significant lead single nucleotide polymorphisms (SNPs) associated with CrCl, two of which had not been reported previously in kidney function GWASs: rs146465192, located near the RP1-249F5.3 gene (effect allele frequency (EAF) = 0.01, P = 3.38 × 10⁻⁹), and rs117014836, located near the AGPAT4 gene (EAF = 0.02, P = 5.42 × 10⁻⁹). Both SNPs were also associated with eGFRcrea in Lifelines (rs146465192: P = 1.34 × 10⁻⁸; rs117014836: P = 3.64 × 10⁻⁷), but not in previously published eGFR GWASs. In silico follow-up analyses revealed that rs146465192 was associated with plasma IGF2R (β = -0.519, P = 1.40 × 10⁻⁶), while rs117014836 was associated with blood expression levels of AGPAT4 (eQTL P = 6.54 × 10⁻⁶). Furthermore, we identified two female-specific CrCl loci (t-statistic P < 0.004): rs8002366 (GPC6) and rs12908437 (IGF1R), associated with GPC6 expression in kidney (eQTL P = 8.38 × 10⁻¹⁰) and IGF1R expression in blood (eQTL P = 2.62 × 10⁻⁶), respectively. Conclusion: This first large-scale GWAS of CrCl revealed two new genetic variants in both sexes and two female-specific variants influencing kidney function. Lay summary: Kidney function is a complex phenotype influenced by many different factors, including genetics. Earlier genetic studies often used the creatinine-based estimated glomerular filtration rate (eGFRcrea) as the measure of kidney function.
However, eGFRcrea is influenced not just by kidney function but also by an individual's muscle mass, which may distort the results. Therefore, in this study we used creatinine clearance (CrCl), a measure of kidney function independent of muscle mass, to search for associated genes in a European-ancestry population. We identified 16 genetic regions, two of which had not been found before. We also found two additional regions that were related to CrCl only in females. This shows the added value of investigating CrCl and suggests sex-based differences in how genetics affect kidney function.
Azak, A.; Avsar, M. G.; Kocak, G.; Koyuncuoglu, A.; Kilickesmez, K.; Basci, O. K.; Avci, E.
Introduction: Patients with type 2 diabetes mellitus (T2DM) are at increased risk of coronary artery disease and frequently undergo coronary angiography or percutaneous coronary intervention. Although risk factors for post-contrast acute kidney injury (PC-AKI) are well defined, effective preventive strategies remain limited. Methods: This multicenter observational cohort study included 975 patients aged 18-75 years who underwent coronary angiography and/or percutaneous coronary intervention with iodinated contrast between June 2023 and June 2024. All patients received standardized intravenous hydration. Participants were grouped according to chronic sodium-glucose co-transporter-2 (SGLT2) inhibitor use (≥3 months). PC-AKI was defined as a ≥25% or ≥0.5 mg/dL increase in serum creatinine within 48-72 hours after contrast exposure. Results: The mean age was 59.2 ± 11.7 years, and 70.8% were male; 16.9% were using SGLT2 inhibitors. PC-AKI occurred in 7.3% of patients, and 0.7% required renal replacement therapy. In univariate analysis, advanced age, diabetes, hypertension, heart failure, diuretic use, and elevated urea, creatinine, potassium, and uric acid levels were associated with PC-AKI. Higher eGFR, albumin, and sodium levels, and SGLT2 inhibitor use, were inversely associated. In multivariate analysis, age ≥65.5 years (OR 4.53), diabetes (OR 2.49), and uric acid >6.75 mg/dL (OR 2.34) remained independent risk factors, while eGFR >81.5 mL/min/1.73 m² (OR 0.38), sodium >137.5 mmol/L (OR 0.36), and SGLT2 inhibitor use (OR 0.09) were independently protective. Conclusion: Beyond their established cardioprotective and renoprotective effects, SGLT2 inhibitors may reduce the risk of PC-AKI in patients with T2DM, potentially through decreased renal oxygen consumption and attenuation of contrast-induced hypoxic injury.
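The PC-AKI definition used in this study (a ≥25% relative or ≥0.5 mg/dL absolute rise in serum creatinine within 48-72 hours of contrast) can be expressed as a simple check. A minimal sketch: the function name and example values are illustrative, not taken from the study's materials.

```python
def meets_pc_aki_definition(baseline_scr: float, followup_scr: float) -> bool:
    """Return True if the creatinine rise meets the study's PC-AKI criteria.

    baseline_scr and followup_scr are serum creatinine in mg/dL; the
    follow-up value is assumed to be drawn 48-72 h after contrast exposure.
    """
    absolute_rise = followup_scr - baseline_scr
    relative_rise = absolute_rise / baseline_scr
    # Either criterion alone is sufficient under this definition.
    return absolute_rise >= 0.5 or relative_rise >= 0.25

print(meets_pc_aki_definition(1.0, 1.3))  # -> True (30% relative rise)
print(meets_pc_aki_definition(1.0, 1.1))  # -> False (10% rise, 0.1 mg/dL)
```

Note that the relative criterion dominates at low baseline creatinine (a 0.3 mg/dL rise from 1.0 mg/dL qualifies), while the absolute criterion matters at higher baselines, which is why composite definitions like this one use both.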
Caplin, B.; Agarwal, S.; Day, A.; Al-Rashed, A.; Oomatia, A.; Gonzalez-Quiroz, M.; Pearce, N.
Introduction: There remains considerable debate as to the cause of the epidemic of Mesoamerican Nephropathy (MeN). We have previously reported early loss of estimated glomerular filtration rate (eGFR) as a surrogate for disease onset in a population-representative cohort study of young adults at risk of disease from Northwest Nicaragua. Using a nested case-control approach, we analysed urine and serum proteins surrounding this timepoint with the aim of gaining insight into the primary disease aetiology. Methods: We conducted label-free ultra-high-performance liquid chromatography mass-spectrometry-based proteomics using urine samples collected at the study visit before, and at, first observed eGFR loss amongst cases and compared results to matched controls. We then performed direct protein measurements in a discovery cohort, followed by quantification of serum total immunoglobulin E (stIgE) at multiple timepoints in a replication cohort. Results: Proteomic analysis demonstrated no differences in the levels of any single protein between cases and controls (n=25 each), at either timepoint, after correction for multiple comparisons. However, functional enrichment analysis demonstrated upregulation of adaptive immune pathways amongst cases. Direct measurements in the discovery cohort using a high-sensitivity PCR-based immunoassay (n=21 controls, 19 cases) demonstrated higher stIgE in cases at the study visit immediately prior to first observed eGFR loss (mean difference 810 kU/L, 95% confidence interval (CI): 162-1457 kU/L). In the replication cohort (n=22 cases, 21 controls), an stIgE level >500 kU/L measured by electrochemiluminescence in study samples from any timepoint in the 3 years prior to the first observed loss of eGFR was independently associated with case status when compared to samples from controls at matched visits (adjusted odds ratio: 8.1, 95% CI: 1.4-47.8). Discussion: A high level of stIgE precedes loss of eGFR in those at risk of MeN.
Understanding what leads to this rise is likely to be key to understanding the cause of the MeN epidemic. Lay summary: Mesoamerican nephropathy describes an epidemic-level chronic kidney disease affecting rural working-age adults in Central America. Although a number of exposures, including occupational heat exposure, have been proposed as the cause of the epidemic, there remains much debate as to the primary aetiology of the disease. In this study we interrogated urine and blood samples from individuals from affected communities at risk of disease, both before and after they developed kidney dysfunction. Using two different approaches, analysis of both urine and blood samples provided evidence of upregulation of immunoglobulin E (IgE)-related pathways in the 2-3 years before individuals developed evidence of kidney disease. Infections (particularly those involving parasites) and allergic reactions, but not heat exposure, have been reported to increase IgE levels. Going forward, understanding the cause of this increase in IgE in individuals at risk of disease is likely to provide insight into the cause of Mesoamerican Nephropathy epidemics.
Limonte, C. P.; Schaub, J. A.; Fallegger, R.; Menon, R.; Schmidt, I. M.; de Boer, I. H.; Parikh, C.; Alpers, C. E.; Caramori, M. L.; Rosas, S.; Mottl, A.; Brosius, F.; Tuttle, K.; Lash, J.; Saez-Rodriguez, J.; Mariani, L. H.; Ricardo, A. C.; Eadon, M. T.; Ju, W.; Henderson, J.; Barisoni, L.; Hodgin, J. B.; Zelnick, L. R.; Sharma, K.; Spraggins, J.; Srivastava, A.; Schrauben, S.; Weir, M.; Hsu, C.-y.; Kelly, T.; Taliercio, J.; Rincon-Choles, H.; Dubin, R.; Cohen, D. L.; Xie, D.; Chen, J.; He, J.; Anderson, A. H.; Kretzler, M.; Himmelfarb, J.; and the CRIC Study Investigators; and the Kidney
Background: The Kidney Precision Medicine Project (KPMP) consortium aims to redefine chronic kidney disease (CKD) by integrating clinical, pathological, and molecular tissue data from kidney biopsies. Here, we demonstrate how biopsy data in CKD can clarify disease etiology and contribute to understanding of disease pathophysiology and clinical prognosis. Methods: The KPMP is obtaining research kidney biopsies from individuals with CKD (defined as an estimated glomerular filtration rate [eGFR] < 60 mL/min/1.73 m² and/or albuminuria ≥30 mg/g creatinine) and diabetes (enrolled as diabetes and CKD, or DKD) or hypertension (enrolled as hypertension and CKD, or HCKD). A team of kidney pathologists and nephrologists adjudicated the primary clinico-pathological diagnosis for 258 participants with CKD. We compared pathological features and kidney transcriptional signatures between participants with a primary adjudicated diagnosis of diabetic nephropathy and those with other causes of CKD. We developed a model using clinical and biomarker data that predicted the probability of diabetic nephropathy and tested associations of the signature with CKD progression among Chronic Renal Insufficiency Cohort (CRIC) participants with diabetes (n=229). Results: Among 183 participants enrolled as DKD, 102 (56%) had a primary adjudicated clinico-pathologic diagnosis of diabetic nephropathy. Among 75 participants enrolled as HCKD, 42 (56%) had a primary diagnosis of hypertension-associated kidney disease. Those with diabetic nephropathy, compared with other diagnoses, had more severe interstitial fibrosis, tubular atrophy, tubular injury, segmental sclerosis, and severe arteriolar hyalinosis, and single-nucleus and single-cell transcriptional analyses revealed upregulation of immune and inflammatory pathways and downregulation of oxidative phosphorylation.
A combination of age, hemoglobin A1c, urine albumin-creatinine ratio, and serum KIM-1 and sTNFR1 predicted a clinico-pathologic diagnosis of diabetic nephropathy in the KPMP (AUC 0.82, 95% CI 0.75-0.89) and was associated with an increased risk of CKD progression among patients with diabetes enrolled in CRIC (HR 1.48 [95% CI 1.27-1.73] per 10% higher predicted probability of diabetic nephropathy). Conclusion: In common presentations of CKD, kidney biopsies may alter a priori impressions, reveal a diversity of diagnosis, structure, and function that is associated with clinical outcomes, and inform therapeutic decisions.
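The KPMP enrolment criterion quoted above (eGFR < 60 mL/min/1.73 m² and/or urine albumin-creatinine ratio ≥30 mg/g) is a simple either/or rule, sketched below. The function name and example values are illustrative assumptions, not taken from KPMP materials.

```python
def meets_ckd_criterion(egfr: float, uacr_mg_g: float) -> bool:
    """CKD per the KPMP enrolment definition quoted in the abstract.

    egfr: estimated GFR in mL/min/1.73 m^2.
    uacr_mg_g: urine albumin-creatinine ratio in mg albumin per g creatinine.
    """
    # Either reduced filtration or albuminuria alone qualifies.
    return egfr < 60 or uacr_mg_g >= 30

print(meets_ckd_criterion(75.0, 45.0))  # -> True (albuminuria alone qualifies)
print(meets_ckd_criterion(90.0, 10.0))  # -> False (neither criterion met)
```

The "and/or" wording matters: participants with preserved eGFR but significant albuminuria still meet the definition, which is why the cohort spans a broad range of kidney function.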
Njipouombe Nsangou, Y. A.; Haug, S.; Ulmer, M. A.; Bellur, O.; Römisch-Margl, W.; Dönitz, J.; Köttgen, A.; Arnold, M.; Kastenmüller, G.
Background: Kidney disease refers to a broad range of disorders that impair renal structure and function. Among these, chronic kidney disease (CKD) is the most prevalent worldwide, affecting approximately 10% of the global adult population. While large-scale omics studies have identified numerous molecular associations with kidney function and disease, these insights often remain isolated within individual data layers, hindering a systems-level understanding of the functional interplay between genes, proteins, metabolites, and clinical phenotypes. Methods: We developed the Kidney Disease Atlas (KD Atlas) using an extended QTL-based integration strategy combined with a composite network approach. For this purpose, we leveraged results from omics studies in population-based and kidney disease-specific cohorts from the CKDGen Consortium and other large-scale initiatives and integrated them with data from knowledge databases, inferring a comprehensive network of relationships between metabolites, proteins, genes, and kidney disease-related traits. Results: We present the KD Atlas, an online resource (https://metabolomics.helmholtz-munich.de/kdatlas) integrating over 25 large studies and providing disease-relevant information on 20,456 protein-coding genes, 1,962 proteins, 1,375 metabolites, and 40 kidney disease phenotypes connected by more than 1.2 million relationships. Through an interactive web interface, researchers can dynamically construct context-specific molecular subnetworks and perform integrated analyses without requiring specialized bioinformatics expertise. Application showcases demonstrate the resource's utility for providing the molecular context of KD-associated genes or metabolites and for generating novel mechanistic hypotheses.
Conclusion: The KD Atlas provides a global, multi-omics network view of kidney pathophysiology through an intuitive interface, empowering researchers to formulate mechanistic hypotheses and prioritize candidate targets for subsequent experimental validation.
Fallegger, R.; Gomez-Ochoa, S. A.; Boys, C.; Ramirez Flores, R. O.; Tanevski, J.; Pashos, E.; Feliers, D.; Piper, M.; Schaub, J. A.; Zhou, Z.; Mao, W.; Chen, X.; Sealfon, R. S. G.; Menon, R.; Nair, V.; Eddy, S.; Alakwaa, F. M.; Pyle, L.; Choi, Y. J.; Bjornstad, P.; Alpers, C. E.; Bitzer, M.; Bomback, A. S.; Caramori, M. L.; Demeke, D.; Fogo, A. B.; Herlitz, L. C.; Kiryluk, K.; Lash, J. P.; Murugan, R.; O'Toole, J. F.; Palevsky, P. M.; Parikh, C. R.; Rosas, S. E.; Rosenberg, A. Z.; Sedor, J. R.; Vazquez, M. A.; Waikar, S. S.; Wilson, F. P.; Hodgin, J. B.; Barisoni, L.; Himmelfarb, J.; Jain, S.;
Acute kidney injury (AKI) and chronic kidney disease (CKD) are two interconnected clinical conditions, both defined by degree of functional impairment but with heterogeneous clinical trajectories. Using new transcriptomic technologies, recent studies have described the cellular diversity of the healthy and injured kidney at the single-cell level. Here, we used single-nucleus transcriptomics to investigate the molecular diversity and commonalities in kidney biopsies from over 150 participants with AKI and CKD enrolled in the Kidney Precision Medicine Project (KPMP), and did so at the individual participant level. Using an unsupervised approach, we identified two multi-cellular programs associated with clinical and histopathological features of acute injury and chronic damage, respectively. We found that these programs are expressed across patients with AKI and CKD, supporting shared, rather than distinct, underlying molecular mechanisms. These programs capture tissue-level compositional changes towards adaptive and failed-repair states in tubular epithelial cells, as well as intra-cellular molecular changes characteristic of stress in all cell types. We identified subunits of the NF-κB and AP-1 complexes, as well as members of the STAT family, as putative upstream regulators of the acute and chronic programs. We were able to link these continuous molecular measures of acute injury and chronic damage with urine and plasma protein profiles obtained at the time of biopsy. These non-invasive protein signatures were predictive of renal outcomes in an independent cohort of approximately 44,000 participants from the UK Biobank. In summary, unbiased identification of cellular programs in kidney disease biopsies defined molecular programs of injury cutting across conventional disease categorisation and established a non-invasive molecular link to long-term patient outcomes.
Gispert Martinez, M.; Chorda Sanchez, M.; Rosello Castells, O.; Ruiz Arranz, A.; Castillo Garcia, J.
Objective: To analyze the experience of the last six years with ECMO in uncontrolled donation after circulatory death (uDCD), assessing the clinical and logistical factors that determine donation effectiveness and the viability of retrieved organs, with the nurse perfusionist as the central figure in organ perfusion. Methods: Retrospective observational study of uDCD procedures performed at Hospital Clinic de Barcelona between June 2019 and October 2025. Results: Of 184 out-of-hospital ECMO-CPR activations, 108 (58.7%) underwent perfusion; 72 donor cases (66.7%) were generated, and 109 kidneys (75.7%) and 3 livers (4.15%) were retrieved. The annual number of uDCD donors was heterogeneous. Compared with non-effective donors, effective donors were significantly younger (48.1 ± 12.4 vs 53.0 ± 10.7 years, p=0.03) and had fewer comorbidities such as hypertension (13.8% vs 33.0%, p=0.018) and diabetes (4.1% vs 16.6%, p=0.027). Although effective donors had a shorter cannulation time (25.6 ± 13.9 vs 29.1 ± 11.9 min, p=0.09), the difference was not statistically significant; however, cardiocompressor time did show a significant difference (58.9 ± 17.7 vs 65.8 ± 18.2 min, p=0.03). Conclusions: uDCD was a useful source of transplantable organs, mainly kidneys (two out of every three perfused patients became donors), in the current context of scarcity of brain-dead donors. Shorter warm ischemia times (cardiocompressor and cannulation times) were significantly associated with more effective organ donation. The multidisciplinary transplant team may benefit from perfusion professionals with expertise in extracorporeal oxygenation therapy.
Costa-Santos, C.; Vidal, R.; Lisboa, S.; Vieira-de-Castro, P.; Monteiro, A.; Duarte, I.
Compassion fatigue is a well-documented hazard among healthcare and veterinary professionals, yet the psychological toll on informal caregivers of feral cat colonies (a population likely numbering several tens of thousands in Portugal) remains largely unexplored. This cross-sectional study examines internal and external factors associated with the secondary traumatic stress component of compassion fatigue among 172 informal caregivers in Portugal. Secondary traumatic stress refers to work-related secondary exposure to individuals who have experienced extremely stressful or traumatic events. Structured telephone interviews assessed sociodemographics, colony management, compassion satisfaction, resilience, spiritual well-being, and perceived social support. Univariate and multivariable linear regression identified predictors of compassion fatigue. Results indicate that 47% of participants experienced moderate compassion fatigue, and 10% reported high levels. Multivariable analysis revealed that caring for large colonies (more than 25 cats) and being unemployed were significantly associated with higher fatigue. Conversely, older age, higher perceived family support, and the resilience dimension of serenity served as protective factors. Interestingly, finding meaning in life was positively correlated with fatigue, suggesting that caregivers who perceive their role as central to their life purpose may become more emotionally invested, increasing vulnerability to distress when unable to help animals. Official colony registration and formal institutional support did not significantly alleviate fatigue. These findings highlight that institutional support alone is insufficient to mitigate fatigue among informal caregivers, who experience significant distress driven by both practical burdens and profound emotional involvement. The most frequently reported concern among caregivers was the inability to cover the costs of feeding and veterinary care for the cats.
Interventions must address both external needs (e.g., support to cover veterinary and feeding expenses for the cats) and internal coping mechanisms. Implementing psychosocial support alongside trap-neuter-return programs may also improve caregiver well-being and foster sustainable urban feral cat management. This underscores a One Health perspective, demonstrating that animal health is closely interconnected with human well-being and environmental health.
Sarang, S.; Matingo-Mutava, E.; Cassim, N.
Background: The COVID-19 pandemic required South African public sector HIV viral load (VL) laboratories to scale up Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) testing while maintaining essential HIV services, placing additional pressure on diagnostic services. This dual mandate introduced significant occupational and environmental challenges (OEC) for staff that remain underexplored. Objective: This study aimed to investigate the OEC and effects that staff experienced during the implementation of COVID-19 testing at public sector VL laboratories in South Africa. Methods: A quantitative, cross-sectional study utilised a census approach among technical and support staff. Data were collected via a structured REDCap questionnaire using 5-point Likert scales. Pre- and post-implementation challenges were assessed across four domains: workload, environmental conditions (space, ventilation, waste), communication, and PPE availability. Statistical analyses included the Wilcoxon signed-rank and Spearman's correlation tests. Results: Perceived occupational challenges increased significantly across all domains post-implementation. Staff workload saw the highest rise (mean score 3.02 to 3.53). Adverse health effects were pervasive: 80.2% of staff reported burnout/fatigue, and 76.5% reported increased anxiety/stress. A strong positive correlation was observed between post-COVID-19 challenges and adverse mental and physical health outcomes (rho = 0.449, p < 0.001). Furthermore, 35.8% of staff considered resigning due to increased job demands. Conclusion: Integrating COVID-19 testing exacerbated systemic weaknesses, causing measurable psychological injury and threatening workforce retention. Findings suggest that the diagnostic workforce requires formal crisis surge staffing models and institutionalised mental health support to safeguard personnel and maintain essential services during future health emergencies.
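The paired pre/post comparison and rank correlation described in this abstract can be sketched in a few lines. This is a minimal illustration with made-up Likert-scale scores, not the study's data; the variable names are assumptions for the example.

```python
# Sketch of a paired pre/post comparison (Wilcoxon signed-rank) and a rank
# correlation (Spearman), as named in the abstract's Methods. All scores
# below are illustrative stand-ins on 5-point Likert scales.
from scipy.stats import wilcoxon, spearmanr

# Hypothetical per-respondent workload scores, pre- and post-implementation
pre = [3.0, 2.8, 3.2, 3.1, 2.9, 3.0, 3.3, 2.7, 3.1, 3.0]
post = [3.6, 3.4, 3.5, 3.7, 3.3, 3.5, 3.8, 3.2, 3.6, 3.4]

# Wilcoxon signed-rank test: non-parametric paired comparison,
# appropriate for ordinal Likert data
stat, p_paired = wilcoxon(pre, post)

# Spearman rank correlation between post-implementation challenge scores
# and a hypothetical adverse-health-effects score
health = [2.9, 2.7, 3.0, 3.3, 2.6, 3.0, 3.5, 2.5, 3.1, 2.8]
rho, p_corr = spearmanr(post, health)

print(f"Wilcoxon p = {p_paired:.4f}, Spearman rho = {rho:.3f}")
```

Both tests operate on ranks rather than raw values, which is why they suit ordinal survey data like the 5-point scales used here.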
Shahriyar, A.; Hanifi, S. M. M. A.; Rahman, S. M.
Background: Dengue outbreaks have become a severe threat to Bangladesh, with infection and mortality numbers skyrocketing in recent years. Favorable environmental and anthropogenic conditions have made the capital, Dhaka city, the epicenter of dengue outbreaks. Studies have shown that climate-change-induced extreme weather events are exacerbating Aedes mosquito breeding and dengue virus transmission conditions. Methodology/Principal Findings: In this study, short-term (0-6 weeks) associations of maximum temperature and heatwave days with dengue cases in Dhaka city were examined through Distributed Lag Non-linear Model (DLNM) methodology for weekly measurements over 2016-2024, taking into account relative humidity, cumulative rainfall, seasonality, and a hospital closure effect. Two separate negative binomial models were constructed. The maximum temperature model rendered an overall inverted U-shaped association, where the maximum temperature range of 31.5-33.2°C showed a sustained elevated dengue risk, with the highest risk estimate at 33.2°C [relative risk (RR): 1.186, 95% CI: 1.002, 1.403]. In contrast, results for weekly heatwave days showed an overall protective effect (RR < 1) for dengue cases. The lowest risk of infection was found at 3 heatwave days per week, with RR 0.275 (95% CI: 0.178, 0.423). Multiple sensitivity analyses were conducted for both models to evaluate their robustness. Lastly, the optimized models were analyzed under three distinct sub-periods to capture the association of exposure variables with predominant circulating serotypes. Conclusions/Significance: The findings of the study aim to support public health policymakers and healthcare authorities in designing and implementing effective vector control interventions under emerging climatic emergencies. Author Summary: Dengue is one of the most burning issues in Bangladesh in recent years.
This vector-borne disease is inherently influenced by climatic variables such as temperature, rainfall, and humidity. Moreover, these relations are complex and non-linear. Due to shifting climatic conditions, extreme weather events are becoming more frequent, with increased magnitude and longer duration. In this study, the nonlinear and delayed association of dengue infections with exposure to extreme temperature events was assessed in climate-change-vulnerable Dhaka city. To do this, a statistical method called the distributed lag non-linear model (DLNM) was used. The results showed that dengue infections had an inverted U-shaped (parabolic) relationship with maximum temperature, relative to the mean maximum temperature, and a suppressive association with heatwaves relative to days without heatwaves. The findings aim to serve as an early warning system and to support policymakers and healthcare authorities in tackling dengue surges in a changing climate.
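A full DLNM fits a penalized cross-basis jointly over exposure and lag; as a much simpler stand-in, the core idea (count outcome regressed on lagged exposure, lags 0-6 weeks, with a negative binomial likelihood) can be sketched on synthetic weekly data. Everything below is illustrative, not the paper's model or data.

```python
# Simplified distributed-lag negative binomial sketch: weekly case counts
# regressed on maximum temperature at lags 0..6 weeks. A real DLNM would
# replace the raw lag columns with a smooth cross-basis.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_weeks = 200
# Synthetic weekly maximum temperature with annual seasonality plus noise
tmax = 28 + 6 * np.sin(np.arange(n_weeks) * 2 * np.pi / 52) + rng.normal(0, 1, n_weeks)

# Build the lag matrix: column k holds temperature k weeks earlier
max_lag = 6
lags = np.column_stack([np.roll(tmax, k) for k in range(max_lag + 1)])
lags = lags[max_lag:]  # drop rows whose lags wrap around

# Synthetic dengue counts depending weakly on the lag-averaged temperature
mu = np.exp(1.5 + 0.03 * lags.mean(axis=1))
cases = rng.poisson(mu)

X = sm.add_constant(lags)
model = sm.GLM(cases, X, family=sm.families.NegativeBinomial(alpha=0.5))
fit = model.fit()
print(fit.params[:3])  # intercept and first two lag coefficients
```

Summing the fitted lag coefficients (or integrating the cross-basis in a real DLNM) gives the overall cumulative exposure-response that the abstract's RR estimates summarize.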
Fraser, J. J.; Zouris, J. M.; Hoch, J. M.; Sessoms, P. H.; MacGregor, A. J.; Hoch, M. C.
Introduction: Musculoskeletal injuries (MSKIs) are ubiquitous in the U.S. military, especially among high-performing service members such as Marines. Because female service members have only been assigned to ground combat roles since December 2015, evaluation of sex as a risk factor for MSKI in ground combat occupations was not possible until there was an ample population to study. The purpose of this population-level epidemiological study was to assess (1) whether female sex was a salient risk factor for MSKI in Marines serving in different military occupations, including combat arms, and (2) the effects of integration period on MSKI risk among female Marines. Materials and Methods: A population-based epidemiological retrospective cohort study of all U.S. Marines was performed assessing female sex, occupation, and integration period on the prevalence of MSKI from 2011 through 2020. The Military Health System Data Repository was utilized to identify initial healthcare encounters for diagnosed ankle-foot, knee, lumbopelvic-hip, thoracocostal, cervicothoracic, shoulder, elbow, or wrist-hand complex injuries. Prevalence was calculated for female and male Marines in each occupational category (combat, combat support, aviators, aviation support, services) during the pre-integration (2011-2015) and post-integration (2016-2020) periods. Results: During the pre-integration period, 520/1,000 female Marines (n=13,985) and 299/1,000 male Marines (n=142,158) incurred MSKIs. In the post-integration period, the prevalence increased to 565/1,000 female Marines (n=17,608) and 348/1,000 male Marines (n=161,429). In the multivariable evaluation of sex, occupation, integration period, and the interaction of sex and occupation on combined MSKIs, only female sex was a significant factor for injury (prevalence ratio [PR]=1.99), with service in ground combat and aviation occupations identified as protective factors when compared with services occupations (PR=0.69).
When these same factors were evaluated for specific MSKI outcomes, female sex remained a robust factor in all lower quarter (PR=1.75-2.63) and upper quarter (PR=1.38-2.36) injuries except for shoulder injuries. Service in ground combat and aviation occupations was protective for all lower quarter injuries (PR=0.46-0.71). In the upper quarter, ground combat was protective for all injuries except for elbow injuries (PR=0.67-0.77). Serving as an aviator was a risk factor for cervicothoracic (PR=1.57) and thoracocostal (PR=1.22) injuries and a protective factor for shoulder (PR=0.73) and wrist-hand (PR=0.46) injuries. Adjusted risk for lumbopelvic-hip (PR=1.13), ankle-foot (PR=1.53), cervicothoracic (PR=1.19), thoracocostal (PR=1.14), and elbow (PR=1.48) injuries significantly increased during the post-integration period. Conclusions: Female sex was a salient factor for MSKI, with service in ground combat and aviation occupations identified as protective factors when compared with services occupations. In the evaluation of specific MSKIs, female sex remained a robust and significant factor in all lower quarter injuries and upper quarter injuries except for shoulder injuries. There was only a significant sex-by-period interaction for shoulder conditions, with an increased risk of these injuries in female Marines in the post-integration period.
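The adjusted PRs above come from multivariable models, but the crude (unadjusted) prevalence ratio can be computed directly from the per-1,000 prevalences the abstract reports, as a worked check on the raw numbers.

```python
# Crude prevalence ratio for female vs. male Marines, computed from the
# per-1,000 period prevalences reported in the abstract. These are
# unadjusted figures; the abstract's PR=1.99 is model-adjusted.
def prevalence_ratio(p_exposed, p_unexposed):
    """Crude prevalence ratio: prevalence in exposed / prevalence in unexposed."""
    return p_exposed / p_unexposed

pre = prevalence_ratio(520, 299)    # pre-integration period, per 1,000
post = prevalence_ratio(565, 348)   # post-integration period, per 1,000

print(f"crude PR pre-integration:  {pre:.2f}")   # ~1.74
print(f"crude PR post-integration: {post:.2f}")  # ~1.62
```

That the crude ratios (≈1.6-1.7) sit below the adjusted PR of 1.99 is consistent with the multivariable model accounting for occupation mix and period effects.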
ncibi, k.
Food costs are increasingly impacted by climate change as countries grow. It is well known that climate change affects the productivity of most agricultural goods, but it is unclear how specifically it affects food costs. The present research explores how the North Atlantic Oscillation (NAO) index, a widely used climate indicator, affects food prices around the world. This is achieved by applying a robust bivariate Hurst exponent (robust bHe). The research creates a color map of this coefficient using a window-sliding technique over various intervals of time, displaying an illustration that changes over time. Additionally, the NAO index and global food prices are examined for causal connections using variable-lag transfer entropy, again with a window-sliding technique. The results show that significant increases in the NAO index are associated with notable rises in a number of international food prices over both long and short periods. Furthermore, the causative role of the NAO index in influencing global food costs is confirmed by variable-lag transfer entropy, directly connecting the research to actionable outcomes for policymakers and the overarching goals of sustainability and food security. This study provides the first direct evidence of a robust, long-range cross-correlation and causal link between the NAO index and key global food prices. It introduces a novel, robust methodological framework to visualize this time-varying relationship, offering a critical tool for policymakers and forecasting models.
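The paper's robust *bivariate* Hurst exponent is a specialized estimator, but the underlying long-range-dependence idea can be illustrated with the classical univariate rescaled-range (R/S) Hurst estimate. The sketch below is a standard textbook implementation on synthetic data, not the paper's method.

```python
# Classical rescaled-range (R/S) Hurst exponent estimate: for each window
# size n, compute mean R/S over non-overlapping windows, then regress
# log(R/S) on log(n); the slope approximates the Hurst exponent H.
import numpy as np

def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent via log-log regression of R/S on window size."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_vals = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())   # cumulative deviations from window mean
            r = dev.max() - dev.min()       # range of cumulative deviations
            s = w.std(ddof=1)               # window standard deviation
            if s > 0:
                rs_vals.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_vals)))
    slope, _ = np.polyfit(log_n, log_rs, 1)  # slope ≈ Hurst exponent
    return slope

rng = np.random.default_rng(0)
h = hurst_rs(rng.normal(size=2048))  # uncorrelated noise: H near 0.5
print(f"estimated H = {h:.2f}")
```

H > 0.5 indicates persistent long-range correlation, which is the property the paper's bivariate variant tracks jointly across the NAO index and food-price series.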
Guyett, A.; Dunbar, C.; Lovato, N.; Nguyen, K.; Bickley, K.; Nguyen, P.; Reynolds, A.; Hughes, M.; Scott, H.; Adams, R.; Lack, L.; Catcheside, P.; Pinilla, L.; Cori, J.; Howard, M.; Anderson, C.; Stevens, D.; Bensen-Boakes, D.-B.; Montero, A.; Stuart, N.; Vakulin, A.
Background: Prolonged wakefulness, restricted sleep, and circadian factors can impact driving performance and road safety. Currently, there are no effective objective roadside tests to detect drivers' state of sleepiness during or prior to driving, or to predict future driving impairment risk. This paper reports on an extended wakefulness protocol used to determine whether a portable virtual reality device administering vestibular-ocular motor function (VOM) tests can effectively 1) detect drivers' state of sleepiness during or just prior to driving, and 2) predict trait sleepiness and future driving risk. Methods: Fifty healthy adults with regular sleep between 9 pm and 8 am were recruited for an experimental laboratory procedure involving two phases: an initial overnight sleep study and a subsequent period of extended wakefulness lasting ~29 hours. During the wakefulness phase, participants undertook neurobehavioural testing, a simulated driving test, and repeat assessments of VOM to establish whether ocular markers can predict sleepiness state and sleepiness-related performance impairments (trial registry ACTRN12621001610820). Discussion: This protocol outlines a study that aims to establish the sensitivity of VOM testing to the effects of extended wakefulness and circadian phase on driver state and trait sleepiness and subsequent sleepiness-related driving impairment. Furthermore, the protocol aims to define the best VOM predictors of driver sleepiness state (roadside testing and pre-drive assessments) and sleepiness trait (predicting future driving risk) to establish proof of concept for its potential application as a roadside, pre-drive, and general sleepiness-related fitness-to-drive test.
Cai, C.; Horm, D.; Fuhrman, B.; Van Pay, C. K.; Zhu, M.; Shelton, K.; Vogel, J.; Xu, C.
Abstract: This protocol is reported in accordance with the SPIRIT 2025 guidelines for clinical trial protocols. Introduction: Young children, from birth to age 5 years, are particularly vulnerable to indoor air pollutants and respiratory pathogens. Portable air purifiers (or filtration) and upper-room ultraviolet germicidal irradiation (UVGI) are two widely used interventions with the potential to improve indoor air quality (IAQ) and reduce sick-related absences. However, a review of the literature revealed no real-world randomized studies evaluating their effectiveness in reducing young children's sick-related absences in early care and education (ECE) classrooms. Methods and Analysis: The OK-AIR study is a longitudinal, cluster-randomized 2x2 factorial trial conducted in Head Start centers using two implementation cohorts: Cohort 1 (five Head Start centers and 20 classrooms from 2023 to 2024) and Cohort 2 (11 centers and 59 classrooms from 2025 to 2026), with expanded inclusion of rural areas. Cohort 1 enrolled 204 children, 48 teachers, and 5 site directors, and Cohort 2 enrolled 462 children, 97 teachers, and 11 site directors. Within each center, four classrooms are randomized to: (1) control; (2) portable filtration; (3) upper-room ultraviolet germicidal irradiation (UVGI); or (4) both interventions. Cohort 2 was initially planned as a second factorial trial but was amended to a purifier-only design due to funding changes; details are provided in the protocol amendments section. We collect continuous IAQ data, including particulate matter (PM) with aerodynamic diameters ≤1 µm (PM1), ≤2.5 µm (PM2.5), ≤4 µm (PM4), and ≤10 µm (PM10); total volatile organic compounds (TVOCs) index; nitrogen oxides (NOx) index; carbon monoxide (CO); noise; temperature; and relative humidity, alongside daily child absences.
Seasonal environmental surface swabs (dining tables and toilet flooring) are tested by Reverse-Transcriptase quantitative Polymerase Chain Reaction (RT-qPCR) for Influenza A/B, Respiratory Syncytial Virus (RSV), Human Parainfluenza Virus Type 3 (HPIV3), Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2), and Norovirus. IAQ monitoring is structured across Winter, Spring, Summer, and Fall, including designated baseline/off-period weeks to characterize temporal and seasonal variability in environmental measures across classrooms and centers. Multi-informant surveys (Director, Teacher, Parent) capture contextual factors, and children's social-emotional development is assessed using teacher ratings on the Devereux Early Childhood Assessment (DECA). The primary outcome is the sick-related absence rate, analyzed as cumulative absences over the attendance year while accounting for clustering by school and classroom using generalized mixed-effects models. Secondary outcomes include children's social-emotional ratings, IAQ metrics and pathogen detection rates; analyses of IAQ incorporate time/seasonal structure, and season-stratified absenteeism analyses will be treated as secondary/exploratory refinements. An economic evaluation will estimate incremental intervention costs and cost-effectiveness/cost-benefit (such as cost per sick-related absence day averted). Ethics and Dissemination: This study was approved by the Institutional Review Board (IRB) at the University of Oklahoma. Findings will be shared through peer-reviewed publications; presentations at local, state, and national conferences; research briefs developed for lay and policy audiences; and community briefings prioritizing the participating early childhood programs and communities. ISRCTN Trial Registration: ISRCTN78764448 Disclaimer: The views expressed are those of the authors and do not reflect the official views of the Uniformed Services University or the United States Department of War. 
Strengths and Limitations of This Study:
• Real-world longitudinal cluster RCT: The study uses a rigorous longitudinal cluster-randomized 2x2 factorial design in real-world ECE settings.
• Combined interventions: Interventions target both air filtration and disinfection, allowing for combined and comparative evaluation.
• Objective air quality monitoring: Continuous monitoring of IAQ metrics provides objective and reliable data on environmental change.
• Environmental pathogen surveillance: qPCR on surface swabs yields an objective biological outcome to triangulate with IAQ and absences.
• Comprehensive context and child measures: Multi-method and multi-reporter data collection includes Head Start attendance records, continuous air monitoring, pathogen detection, contextual surveys completed by center directors, teachers, and parents, and standardized social-emotional assessments (DECA) completed by classroom teachers. Head Start program records provide children's longer-term health data, available through Health Insurance Portability and Accountability Act (HIPAA) authorization.
• Clustered/temporal complexity: The seasonal design accounts for variation over time but may introduce complexity in modeling temporal effects.
• Practical implications: Study findings will have practical implications for Head Start and other ECE programs striving to maximize child attendance with cost-effective strategies.
Keywords: Early childhood; Head Start; indoor air quality (IAQ); air purifiers; filtration; ultraviolet germicidal irradiation; cluster randomized trial; absenteeism; environmental pathogens; DECA; cost-benefit analysis
Liang, L.; Zhang, S. X.; Lin, J. J.
The co-occurrence of per- and polyfluoroalkyl substances (PFAS) and volatile organic compounds (VOCs) in industrial environments poses complex toxicological risks that standard additive models fail to capture. This study elucidates a novel "metabolic blockade" mechanism wherein PFAS competitively inhibits the renal excretion of VOC metabolites, thereby amplifying neurotoxic burdens. Utilizing a Double Machine Learning (DML) framework on data from the National Health and Nutrition Examination Survey (2005-2020), we analyzed a final intersectional cohort of 1,975 participants. We identified a robust inhibition of VOC metabolite clearance by serum PFAS. Specifically, PFNA significantly suppressed the excretion of the benzene metabolite URXPMA (causal βTMLE = -0.219, p < 0.001), with efficacy dependent on perfluorinated chain length. Molecular docking simulations revealed the biophysical basis of this antagonism: long-chain PFNA exhibited superior binding affinity to the Organic Anion Transporter 1 (OAT1) (ΔG = -6.333 kcal/mol) compared to native VOC metabolites (ΔG = -4.957 kcal/mol), confirming high-affinity competitive inhibition at the renal interface. In a neurocognitive sub-cohort (N = 1,200), this interference translated into functional synergism; high-PFNA exposure magnified VOC-associated cognitive impairment by 1.5-fold and significantly exacerbated the negative association between VOC burden and processing speed (βint = -0.263, p = 0.004). These findings define PFAS as a "metabolic amplifier" of co-contaminant toxicity, necessitating a paradigm shift toward mixture-based hazardous material regulations that account for transporter-level interactions.
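The docking ΔG values in this abstract can be translated into an approximate relative binding-affinity ratio via the standard Boltzmann relation K ∝ exp(-ΔG/RT). The arithmetic below applies that textbook relation to the abstract's reported numbers; the resulting fold-difference is not a figure reported by the study itself.

```python
# Relative binding affinity from docking free energies: the ratio of
# association constants follows from K ∝ exp(-ΔG / RT).
import math

RT = 0.001987 * 298.15   # kcal/mol at 25 °C (gas constant R in kcal/(mol·K))
dG_pfna = -6.333         # PFNA binding to OAT1, kcal/mol (from the abstract)
dG_vocm = -4.957         # native VOC metabolite binding, kcal/mol

# Fold-difference in association constants (more negative ΔG = tighter binding)
ratio = math.exp(-(dG_pfna - dG_vocm) / RT)
print(f"PFNA binds OAT1 ~{ratio:.0f}x more tightly than the VOC metabolite")
```

An order-of-magnitude affinity advantage of this size is what makes competitive inhibition at OAT1 a plausible mechanism for the "metabolic blockade" the study proposes.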
Belvis, F.; Vicente-Castellvi, E.; Verdaguer, S.; Gutierrez-Zamora, M.; Benach, J.; Bodin, T.; Gevaert, J.; Girardi, S.; Harris, J.; Ilsoe, A.; Kokkinen, L.; Larsen, T. P.; Lee, S.; Lundh, F.; Mangot-Sala, L.; Matilla-Santander, N.; Merecz-Kot, D.; Nurmi, H.; Warhurst, C.; Julia, M.
Purpose: The GIG-OSH cohort was established to investigate the impact of digital platform work on occupational safety and health (OSH), working and employment conditions, and health in seven countries in Europe. Participants: The cohort comprises 3,945 digital platform workers from seven European countries. The sample includes both web-based workers (e.g., micro-tasking, freelance design) and on-location workers (e.g., delivery, transport). Participants were recruited using non-probabilistic sampling strategies tailored to national contexts, including social media advertising, recruitment through micro-task platforms, and on-site field outreach. Multidimensional data have been collected through online surveys (implemented via REDCap) covering sociodemographic characteristics, working and employment conditions, psychosocial risks, algorithmic management, and physical and mental health indicators. Findings to date: Participants had a mean age of 32.6 years at baseline (SD 10.4), and the majority are male (58.8%), with a higher concentration of migrants in on-location tasks (62.2%) compared to web-based tasks (48.8%). Regarding educational attainment, 55.4% of the total cohort holds a tertiary degree, reaching 64.4% among web-based workers. Platform work intensity varies significantly: on-location workers averaged 85.4 hours of work in the last month, while web-based workers averaged 47.0 hours. Mean income from platform work as a percentage of the national median was 20.6% (SD 22.2). The mean WHO-5 Well-Being Index score was 58.7 (SD 20.3), which is notably lower than the European general population average (69.4), indicating poorer mental health outcomes among cohort members. Future plans: The GIG-OSH cohort represents the first large-scale, longitudinal study examining occupational safety and health among digital platform workers across multiple European countries. 
Future waves will prioritize developing precise tools to measure hourly earnings and unpaid waiting time. Future research should aim to include underrepresented subgroups, such as medical and domestic care workers, and explore potential linkage with administrative records to evaluate long-term health trajectories and the impact of new EU labour regulations.